9 research outputs found

    Hybrid Algorithm for the Detection of Lung Cancer Using CNN and Image Segmentation

    When it comes to cancer-related fatalities, lung cancer is by far the most common cause. As with other cancers, early detection is the key to successful diagnosis and treatment. Automatic CAD systems for lung cancer screening using computed tomography scans primarily involve two steps: first, detecting all potentially malignant pulmonary nodules, and second, determining whether or not those nodules are malignant. A large body of recent literature addresses the first step, but far less work covers the second. Screening for lung cancer requires a careful investigation of each suspicious nodule and the integration of information from all nodules, because the presence of pulmonary nodules does not always indicate cancer, and the morphology of nodules, including their shape, size, and contextual information, has a complex relationship with cancer. To overcome this problem, we suggest a deep CNN architecture that differs from those commonly utilised in computer vision. Suspicious nodules are first generated with a modified version of U-Net and then used as input data for our model. To automatically diagnose lung cancer, the suggested model is a multi-path CNN that concurrently exploits local features as well as more general contextual features from a wider surrounding region. To accomplish this, the model consists of three separate pathways, each with a different receptive field size, which helps model both short- and long-range dependencies between neighbouring pixels. We then concatenate the features from the three pathways to further improve the model's performance. Finally, one of our contributions is a retraining-phase scheme that enables us to address issues caused by an uneven distribution of image labels. Experimental findings from the KDSB 2017 challenge demonstrate that our model is more adaptable to the described inconsistency among the nodules' sizes and shapes. Furthermore, our model obtained better results in comparison with other studies.
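    The abstract's key architectural idea is that each of the three pathways sees a different receptive field, so the model captures both short- and long-range pixel dependencies. The sketch below illustrates how stacked convolutional layers determine a pathway's effective receptive field; the layer configurations are illustrative assumptions, not the authors' actual architecture.

```python
# Hypothetical sketch: effective receptive field of a stack of conv layers.
# The (kernel_size, stride) configurations below are made-up examples to
# show how three pathways can cover different spatial extents.

def receptive_field(layers):
    """layers: list of (kernel_size, stride) tuples, applied in order.
    Returns the effective receptive field (in input pixels) of the stack."""
    rf, jump = 1, 1
    for k, s in layers:
        rf += (k - 1) * jump   # each layer widens the field by (k-1) * jump
        jump *= s              # stride compounds the per-pixel step size
    return rf

# Three illustrative pathways: small, medium, and large receptive fields,
# mirroring the short- and long-range dependency modelling in the abstract.
short_path  = [(3, 1), (3, 1)]            # RF = 5
medium_path = [(5, 1), (3, 2), (3, 1)]    # RF = 11
long_path   = [(7, 2), (5, 2), (3, 1)]    # RF = 23

for name, path in [("short", short_path), ("medium", medium_path), ("long", long_path)]:
    print(name, receptive_field(path))
```

    Concatenating the outputs of such pathways gives each spatial position a feature vector that blends fine local texture with broader context, which is the stated motivation for the multi-path design.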

    Message queue telemetry transport and lightweight machine-to-machine comparison based on performance efficiency under various scenarios

    The Internet of things (IoT) has been advancing over a long period in many respects. Various IoT protocols have been proposed for data transfer between IoT devices in a wireless sensor network. Among them, the most widely used are the constrained application protocol (CoAP) and message queue telemetry transport (MQTT). To overcome the limitations of CoAP, the lightweight machine-to-machine (LwM2M) framework was designed on top of CoAP. Recent statistics show that both LwM2M and MQTT are widely used, though LwM2M less so than MQTT. Our paper compares MQTT and LwM2M on the basis of performance efficiency, measured by sending the same file through both protocols to the server. Performance efficiency is calculated in two scenarios: i) when the client makes a connection with the server, i.e., the initial connection, and ii) while sending a data file to the server, i.e., data transfer. Both protocols are tested on the number of packets sent and the variability of packet size throughout the session. Experimental results indicated that LwM2M outperformed MQTT in both scenarios by almost 69%. We therefore conclude that LwM2M is the better choice over MQTT, although MQTT can still be used in some situations if necessary.
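    The comparison described above rests on two per-session metrics: the number of packets sent and the variability of packet size. A minimal sketch of that bookkeeping, assuming per-packet byte counts taken from a capture tool; the packet sizes below are made-up placeholders, not the paper's measurements:

```python
# Hypothetical sketch: comparing two protocol sessions on packet count,
# total bytes, and packet-size variability. The captures are invented
# placeholder data, not the experimental results reported in the paper.
from statistics import pstdev

def session_stats(packet_sizes):
    """Summarise one capture session from a list of per-packet byte counts."""
    return {
        "packets": len(packet_sizes),
        "total_bytes": sum(packet_sizes),
        "size_stdev": pstdev(packet_sizes),  # variability of packet size
    }

def efficiency_gain(baseline, candidate):
    """Relative reduction in bytes sent by candidate vs. baseline session."""
    return 1 - candidate["total_bytes"] / baseline["total_bytes"]

# Placeholder captures for the data-transfer scenario (bytes per packet).
mqtt_capture  = [60, 1460, 1460, 1460, 120, 60]
lwm2m_capture = [50, 512, 512, 256, 50]

gain = efficiency_gain(session_stats(mqtt_capture), session_stats(lwm2m_capture))
print(f"LwM2M sent {gain:.0%} fewer bytes than MQTT in this mock session")
```

    The same bookkeeping would be run separately for the initial-connection and data-transfer scenarios to produce the two per-scenario comparisons the abstract describes.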

    Rational by design: The effects of social media algorithms on human behaviour and self-identity in India.

    No full text
    Rational algorithms exist all around us and are ingrained deeply in our lives. They claim to help us connect with people, information, and even our supposed future lovers. Although algorithms achieve most of their goals accurately, and humans benefit greatly from them, we as behavioural scientists cannot help but ask: at what cost? We acknowledge that these algorithms have simplified human life by reducing choice overload, but sometimes they also lead people to unfavourable long-term outcomes. We theorise this to be a result of the algorithms being designed for 'Homo economicus' rather than Homo sapiens.

    Microbiome disturbance and resilience dynamics of the upper respiratory tract during influenza A virus infection

    No full text
    Influenza infection can be aggravated by bacterial co-infections, which often result in disease exacerbation. The effects of influenza infection on the upper respiratory tract (URT) microbiome are largely unknown. Here, we report a longitudinal study assessing the temporal dynamics of the URT microbiomes of uninfected and influenza A virus (IAV)-infected humans and ferrets. The URT microbiomes of uninfected humans and ferrets maintain stable, healthy ecostate communities both within and between individuals. In contrast, infected patients and ferrets exhibit large changes in bacterial community composition over time and between individuals. The unhealthy ecostates of infected individuals progress towards the healthy ecostate, coinciding with viral clearance and recovery. Pseudomonadales associate statistically with the disturbed microbiomes of infected individuals. The dynamic and resilient microbiome observed during influenza virus infection in multiple hosts provides a compelling rationale for maintaining microbiome homeostasis as a potential therapeutic target to prevent IAV-associated bacterial co-infections.